Optimal Design of Multiple Description Lattice Vector Quantizers
In the design of multiple description lattice vector quantizers (MDLVQ),
index assignment plays a critical role. In addition, one also needs to choose
the Voronoi cell size v of the central lattice, the sublattice index N, and the
number of side descriptions K to minimize the expected MDLVQ distortion, given
the total entropy rate Rt of all side descriptions and the description loss
probability p. In this paper we propose a linear-time MDLVQ index assignment
algorithm for any K >= 2 balanced descriptions in any dimension, based on a
new construction called the K-fraction lattice. The algorithm is greedy in
nature but is proven to be asymptotically (N -> infinity) optimal for any K >=
2 balanced descriptions in any dimension, given Rt and p. The result is
stronger when K = 2: the optimality holds for finite N as well, under some mild
conditions. For K > 2, a local adjustment algorithm is developed to augment the
greedy index assignment, and is conjectured to be optimal for finite N.
Our algorithmic study also leads to a better understanding of v, N and K in
optimal MDLVQ design. For K = 2 we derive, for the first time, a
non-asymptotic closed-form expression for the expected distortion of optimal
MDLVQ in terms of p, Rt and N. For K > 2, we tighten the current asymptotic
formula for the expected distortion, relating the optimal values of N and K to
p and Rt more precisely.
Comment: Submitted to IEEE Trans. on Information Theory, Sep 2006 (30 pages, 7
figures)
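The flavor of a greedy index assignment can be shown with a minimal 1-D sketch: central lattice Z, sublattice N·Z, K = 2 descriptions, and label pairs handed out cheapest-first by two-sided squared error. The function name, the restriction to one period [0, N), and the candidate window are illustrative assumptions; this is not the paper's K-fraction construction.

```python
def greedy_index_assignment(N, span=3):
    """Greedily label each central lattice point c in one sublattice
    period [0, N) with a distinct ordered pair (l1, l2) of sublattice
    points, minimizing the two-description cost (c-l1)^2 + (c-l2)^2.
    Illustrative 1-D sketch only, not the paper's algorithm."""
    sub = [N * i for i in range(-span, span + 1)]   # sublattice points N*Z
    pairs = [(a, b) for a in sub for b in sub]      # candidate label pairs
    assignment = {}
    used = set()                                    # labels must be distinct
    for c in range(N):                              # central points, one period
        best = min((p for p in pairs if p not in used),
                   key=lambda p: (c - p[0]) ** 2 + (c - p[1]) ** 2)
        assignment[c] = best
        used.add(best)
    return assignment
```

For N = 3 this assigns (0, 0) to the central point 0, then the cheapest remaining pairs to points 1 and 2; receiving only one description decodes to the corresponding sublattice point, receiving both recovers the pair and hence the central point.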
Cognitive Deficit of Deep Learning in Numerosity
Subitizing, or the sense of small natural numbers, is an innate cognitive
function of humans and primates; it responds to visual stimuli prior to the
development of any symbolic skills, language or arithmetic. Given successes of
deep learning (DL) in tasks of visual intelligence and given the primitivity of
number sense, a tantalizing question is whether DL can comprehend numbers and
perform subitizing. But somewhat disappointingly, extensive experiments in the
style of cognitive psychology demonstrate that the examples-driven, black-box
DL cannot see through superficial variations in visual representations and
distill the abstract notion of natural number, a task that children perform
with high accuracy and confidence. The failure is apparently due to the
learning method, not the CNN computational machinery itself. A recurrent
neural network capable
of subitizing does exist, which we construct by encoding a mechanism of
mathematical morphology into the CNN convolutional kernels. Also, we
investigate, using subitizing as a test bed, the ways to aid the black box DL
by cognitive priors derived from human insight. Our findings are mixed and
interesting, pointing to both cognitive deficit of pure DL, and some measured
successes of boosting DL by predetermined cognitive implements. This case study
of DL in cognitive computing is meaningful, for visual numerosity represents a
minimum level of human intelligence.
Comment: Accepted for presentation at the AAAI-1
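The invariance at stake in the numerosity task, an answer that depends only on how many objects appear, not on their shape, size, or position, can be illustrated outside any neural network by a classical connected-component count. The flood-fill sketch below is a hypothetical illustration of the target function, not the authors' morphology-based recurrent network.

```python
def count_objects(grid):
    """Count 4-connected foreground blobs in a binary grid by
    iterative flood fill. The result is invariant to blob shape,
    size, and position -- the abstraction pure DL failed to learn."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                      # found a new blob
                stack = [(r, c)]
                while stack:                    # flood-fill the whole blob
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and grid[y][x] and not seen[y][x]):
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return count
```

A network that has truly acquired numerosity would compute the same answer as this routine across arbitrary re-renderings of the same count.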